High-order conditional mutual information maximization for dealing with high-order dependencies in feature selection

Authors

Abstract

This paper presents a novel feature selection method based on conditional mutual information (CMI). The proposed High-Order Conditional Mutual Information Maximization (HOCMIM) incorporates high-order dependencies into the feature selection procedure and has a straightforward interpretation due to its bottom-up derivation. HOCMIM is derived from the chain expansion of CMI, expressed as a maximization optimization problem. The problem is solved using a greedy search procedure, which speeds up the entire process. Experiments are run on a set of benchmark datasets (20 in total). HOCMIM is compared with eighteen state-of-the-art feature selection algorithms, and the results are evaluated using two supervised learning classifiers (Support Vector Machine and K-Nearest Neighbor). HOCMIM achieves the best results in terms of accuracy and is shown to be faster than its counterparts.



Similar resources

Can high-order dependencies improve mutual information based feature selection?

Mutual information (MI) based approaches are a popular paradigm for feature selection. Most previous methods have made use of low-dimensional MI quantities that are only effective at detecting low-order dependencies between variables. Several works have considered the use of higher dimensional mutual information, but the theoretical underpinning of these approaches is not yet comprehensive. To ...

Full text

Efficient High-Order Interaction-Aware Feature Selection Based on Conditional Mutual Information

This study introduces a novel feature selection approach CMICOT, which is a further evolution of filter methods with sequential forward selection (SFS) whose scoring functions are based on conditional mutual information (MI). We state and study a novel saddle point (max-min) optimization problem to build a scoring function that is able to identify joint interactions between several features. Th...

Full text

Binary Feature Selection with Conditional Mutual Information

In a context of classification, we propose to use conditional mutual information to select a family of binary features which are individually discriminating and weakly dependent. We show that on a task of image classification, despite its simplicity, a naive Bayesian classifier based on features selected with this Conditional Mutual Information Maximization (CMIM) criterion performs as well as a c...

Full text

Higher Order Mutual Information Approximation for Feature Selection

Feature selection is a process of choosing a subset of relevant features so that the quality of prediction models can be improved. An extensive body of work exists on information-theoretic feature selection, based on maximizing Mutual Information (MI) between subsets of features and class labels. The prior methods use a lower order approximation, by treating the joint entropy as a summation of ...

Full text

Conditional Dynamic Mutual Information-Based Feature Selection

With the emergence of new techniques, data in many fields are getting larger and larger, especially in the dimensionality aspect. The high dimensionality of data may pose great challenges to traditional learning algorithms. In fact, many of the features in a large volume of data are redundant and noisy. Their presence not only degrades the performance of learning algorithms, but also confuses end-users in th...

Full text


Journal

Journal title: Pattern Recognition

Year: 2022

ISSN: 1873-5142, 0031-3203

DOI: https://doi.org/10.1016/j.patcog.2022.108895